Improving on the Mixture of Experts Algorithm
Author
Abstract
Difficult non-linear problems can be mapped to a set of localised linear problems. A self-organising map (SOM) is used as the gating function for a localised mixture-of-experts classifier, and is shown to find solutions equivalent to those learned by a multi-layer perceptron while retaining the simplicity and resilience of a single-layer perceptron. Modifications to the traditional softmax gate function and to the expert training function are shown to produce more accurate classifiers that are less prone to over-fitting the training data.
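To make the gating idea concrete, the sketch below pairs a small SOM with one linear (single-layer) logistic expert per SOM unit. It uses hard best-matching-unit gating and omits neighbourhood cooperation, so it is a simplification of the abstract's approach (which modifies a softmax gate rather than using a hard winner); all names, learning rates, and the noisy-XOR data are illustrative assumptions.

```python
import numpy as np

class SOMGatedMixture:
    """Sketch: a SOM partitions the input space and each SOM unit owns a
    single-layer logistic expert trained only on the samples it wins.
    Hard best-matching-unit (BMU) gating; neighbourhood updates omitted,
    so the SOM layer here reduces to online vector quantisation."""

    def __init__(self, n_units, n_features, lr_som=0.3, lr_expert=0.5):
        self.protos = np.zeros((n_units, n_features))   # SOM prototypes
        self.w = np.zeros((n_units, n_features + 1))    # linear experts (+ bias)
        self.lr_som, self.lr_expert = lr_som, lr_expert

    def _bmu(self, x):
        # Gate: index of the closest prototype (hard winner).
        return int(np.argmin(((self.protos - x) ** 2).sum(axis=1)))

    def fit(self, X, y, epochs=30, seed=0):
        rng = np.random.default_rng(seed)
        # Initialise prototypes from random training samples (avoids dead units).
        self.protos = X[rng.choice(len(X), len(self.protos), replace=False)].copy()
        for epoch in range(epochs):
            decay = 1.0 / (1.0 + epoch)                 # simple learning-rate decay
            for x, t in zip(X, y):
                k = self._bmu(x)
                # SOM update: pull the winning prototype toward x.
                self.protos[k] += self.lr_som * decay * (x - self.protos[k])
                # Expert update: one logistic gradient step on the winner only.
                xb = np.append(x, 1.0)
                p = 1.0 / (1.0 + np.exp(-self.w[k] @ xb))
                self.w[k] += self.lr_expert * (t - p) * xb

    def predict(self, X):
        preds = []
        for x in X:
            xb = np.append(x, 1.0)
            preds.append(1.0 / (1.0 + np.exp(-self.w[self._bmu(x)] @ xb)) > 0.5)
        return np.array(preds, dtype=int)

# Noisy XOR: not linearly separable globally, but each local region is.
rng = np.random.default_rng(1)
centres = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
labels = np.array([0, 1, 1, 0])
X = np.vstack([c + 0.08 * rng.normal(size=(40, 2)) for c in centres])
y = np.repeat(labels, 40)

model = SOMGatedMixture(n_units=4, n_features=2)
model.fit(X, y)
print((model.predict(X) == y).mean())  # accuracy near 1.0 expected on this toy
```

The design point the sketch illustrates: no single linear expert can solve XOR, but once the SOM has carved the input space into local regions, each region's subproblem is linearly separable.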
Similar Resources
Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection
In this study, two novel learning algorithms are applied to a Radial Basis Function Neural Network (RBFNN) to approximate highly non-linear functions. Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly reduce the error function. The main idea concerns various strategies for optimizing the procedure of Gradient ...
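As background for the RBFNN idea above, here is a minimal sketch of an RBF expansion fitted to a toy 1-D non-linear function with closed-form least-squares output weights. The fixed grid of centres, the width, and the toy data are illustrative assumptions standing in for the paper's PE/GMM-based centre selection.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian basis activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2*width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

# Toy 1-D regression: approximate a non-linear target with an RBF expansion.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 100)[:, None]
y = np.sin(2 * X[:, 0]) + 0.05 * rng.normal(size=100)

centers = np.linspace(-3, 3, 10)[:, None]    # fixed grid here; the paper instead
                                             # tunes centres via GMM / PE search
Phi = rbf_design(X, centers, width=0.6)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # closed-form output weights
print(np.mean((Phi @ w - y) ** 2))           # small training MSE expected
```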
Mixture of Experts for Persian handwritten word recognition
This paper presents the results of Persian handwritten word recognition based on the Mixture of Experts (ME) technique. In the basic form of ME, the problem space is automatically divided into several subspaces for the experts, and the outputs of the experts are combined by a gating network. In our proposed model, we used a Mixture of Experts of multi-layer perceptrons with a momentum term, in the classification ...
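For reference, the basic ME combination this abstract describes can be written as y(x) = Σᵢ gᵢ(x) oᵢ(x), where the gate values gᵢ(x) come from a softmax over a gating network's outputs. A minimal sketch with linear experts and a linear gate follows; all shapes and values are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max()                  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def mixture_output(x, expert_weights, gate_weights):
    """Standard ME combination: gate probabilities weight the expert outputs.
    expert_weights: (n_experts, n_features) -- one linear expert per row.
    gate_weights:   (n_experts, n_features) -- linear gating network."""
    gates = softmax(gate_weights @ x)    # g_i(x), non-negative, sums to 1
    outputs = expert_weights @ x         # o_i(x), one scalar per expert
    return gates @ outputs               # y(x) = sum_i g_i(x) * o_i(x)

x = np.array([0.5, -1.0, 1.0])
E = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.5]])
G = np.array([[2.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
print(mixture_output(x, E, G))
```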
Improving Phoneme Sequence Recognition using Phoneme Duration Information in DNN-HSMM
Improving phoneme recognition has attracted the attention of many researchers due to its applications in various fields of speech processing. Recent research shows that using deep neural networks (DNNs) in speech recognition systems significantly improves their performance. There are two phases in DNN-based phoneme recognition systems: training and testing. Mos...
Boosted Pre-loaded Mixture of Experts for low-resolution face recognition
A modified version of Boosted Mixture of Experts (BME) for low-resolution face recognition is presented in this paper. Most methods developed for low-resolution face recognition have focused on improving the resolution of face images and/or on special feature-extraction methods that can deal effectively with the low-resolution problem. However, we focused on the classification step of face recogniti...
Extended Mixture of MLP Experts by Hybrid of Conjugate Gradient Method and Modified Cuckoo Search
This paper investigates a new method for improving the learning algorithm of the Mixture of Experts (ME) model using a hybrid of Modified Cuckoo Search (MCS) and Conjugate Gradient (CG) as a second-order optimization technique. The CG technique is combined with the Back-Propagation (BP) algorithm to yield a much more efficient learning algorithm for the ME structure. In addition, the experts and gating net...
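The CG ingredient mentioned above can be illustrated in isolation. The sketch below applies a nonlinear conjugate-gradient (Polak-Ribière) direction update with a fixed step size to a toy quadratic loss; it is only a sketch of the CG update rule, not the paper's MCS+CG hybrid or its ME training loop, and all names and settings are assumptions.

```python
import numpy as np

def cg_minimise(grad_fn, w, steps=50, lr=0.1):
    """Nonlinear conjugate gradient (Polak-Ribiere+) with a fixed step size.
    A proper line search would normally replace the fixed lr."""
    g = grad_fn(w)
    d = -g                                              # first step: steepest descent
    for _ in range(steps):
        w = w + lr * d
        g_new = grad_fn(w)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # Polak-Ribiere+ coefficient
        d = -g_new + beta * d                           # conjugate direction update
        g = g_new
    return w

# Toy quadratic: minimise 0.5*w'Aw - b'w, whose gradient is Aw - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
w = cg_minimise(lambda w: A @ w - b, np.zeros(2))
print(w, np.linalg.solve(A, b))   # the two should roughly agree
```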